The usability of medical devices has increasingly become a focus for healthcare facilities seeking to reduce the likelihood of error or injury, improve the communication of results, and achieve operational cost savings [1,2]. In the U.S., 165 × 10⁶ ultrasound exams are performed annually, in a market of over $1 × 10⁹. This means that improvements to the usability and diagnostic capabilities of ultrasound systems could lead to notable advances in the delivery of healthcare [3].

On a premium ultrasound system, navigation and image processing are centered on a touch panel, a trackball, and various knobs and buttons for individual features. A sonographer uses one hand to operate these controls while the other hand guides a probe to scan a patient's internal organs or the blood flowing through blood vessels. Ultrasound systems are often used in environments in which other factors compete for the end user's attention, so the systems must be streamlined and easy to use.

To gauge the usability of ultrasound systems, three comparable premium systems from different vendors were evaluated in a comparative usability test. The intent was to evaluate scanning workflows for efficiency, effectiveness, and satisfaction as sonographers used the systems' control panels, touch panels, and monitors. Usability test results reveal the assumptions that end users make and inform designers of device characteristics that may lead to higher risks of errors and deviations and, ultimately, a product recall [4].

To determine how a particular system performs in relation to similar competitor systems, a comparative usability test can be conducted. Such a test involves end users of the systems and requires them to complete the same set of tasks on each system.

In this study, sonographers with a specialty in abdominal sonography served as usability test participants.
They evaluated the three ultrasound systems through simulated but representative scanning tasks based on a standard abdominal exam. Examples of these tasks included taking measurements of organs on 2D images and annotating images with text, arrows, and body markers. The specialty of abdominal sonography was chosen to allow recruitment of a large group of candidates with similar demographics and experience.

The usability test involved 20 participants: 10 practicing sonographers who currently use one vendor's ultrasound system (system A) and 10 who currently use another vendor's system (system B). Each group evaluated a third system, a new beta ultrasound system, and the system they were not familiar with (i.e., current system A users evaluated the beta system and system B).

The usability test was conducted at a user research facility equipped with a state-of-the-art usability testing lab. The facility had no design or development ties to any of the three ultrasound systems evaluated. Human models acted as patients during the usability tests, giving the sonographers real physiology to scan. Both adult male and female models were recruited; all were generally healthy, fairly easy to scan, and had no suspected abnormal anatomy.

At the beginning of a usability session, each participant received training on the beta system and one of the other systems. For each system, a training video was created by Macadamian featuring a sonographer experienced with that particular system completing similar tasks with similar features. After watching each video, participants were given the same amount of time to familiarize themselves with each system before the usability tests began. A counterbalancing technique, random order with rotation, was used for both the training and the usability tests.
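A minimal sketch of such an assignment scheme (not the study's actual procedure) shuffles a base order of conditions once and then rotates it for each successive participant; the condition labels below are illustrative.

```python
import random

def rotated_orders(conditions, n_participants, seed=None):
    """Random order with rotation: shuffle a base order of conditions,
    then rotate it by one position for each successive participant."""
    rng = random.Random(seed)
    base = list(conditions)
    rng.shuffle(base)  # random starting order
    orders = []
    for i in range(n_participants):
        k = i % len(base)
        # rotate the base order by k positions for participant i
        orders.append(base[k:] + base[:k])
    return orders

# e.g., the two systems each participant in this study evaluated
print(rotated_orders(["beta", "system B"], 4, seed=1))
```

Over a full cycle of participants, each condition appears in each position equally often, which spreads order-dependent effects evenly across conditions.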
This technique helps to mitigate carryover effects from participant to participant, such as fatigue, practice, habituation, sensitization, and adaptation.

At the end of every usability test, each participant was asked to complete the System Usability Scale (SUS) questionnaire and indicate his or her preferred system. The SUS questionnaire yields a clear, definite score (out of 100) of perceived usability [5]. A score over 68 is considered above average, while scores under 60 indicate systems with poor usability [6].

To keep the research results objective, three specific measures were taken. User experience (UX) professionals with no ties to the design or development of any of the systems ran the study. An experienced UX researcher acted as the test facilitator, so as not to guide or advise the test participants. These measures were coupled with the use of identical test protocols for each system.

Matched-pair t-tests were used to determine whether any significant differences existed between the designs based on usability metrics captured during the tests. The metrics captured by the UX researchers included task completion rates, number of errors, number of deviations, and ease-of-use ratings, as well as qualitative feedback from participants. The matched pairs consisted of the data from the beta system and the corresponding data from either system A or system B for the same participant.

The number of observed errors and the task completion rates measured design effectiveness. Design elements were further understood through the collection of qualitative feedback.

A higher rate of successful task completion was observed during the beta system's usability tests: twice as many test tasks were completed with a 100% success rate, as compared to the other vendors' results. This was largely due to a more intuitive touch panel design and layout.
For example, when annotating an image with text, users were provided with color-coded groupings of annotations that were found to be more straightforward than on the other systems. Participants also stated that annotation and measurement tasks were easier to complete because of the clear labeling of the trackball legend and touch screen controls when adjusting, moving, and deleting items on an image.

The design of a simple control panel, with related functions grouped closely together, let sonographers search less and act with more confidence, using fewer hand movements around the control panel. Context-aware knobs and buttons that lit up when a mode was selected confirmed the choices available to sonographers.

For most tasks, the rate of observed errors during the beta system's usability tests was the same as or lower than that of the other vendors' systems. This was due to an improved control panel layout and more intuitive labeling, lighting, and positioning of controls.

Ultrasound system satisfaction was measured with task-by-task ease-of-use ratings and further understood through qualitative feedback. Overall, the beta system received higher ease-of-use ratings than the other vendors' systems and the greatest amount of positive qualitative feedback. While all three systems performed well overall, certain features were more likely to present usability challenges. For example, if a system did not allow calipers to be adjusted or did not present the annotation packages intuitively, a sonographer's work was slowed considerably and the task was rated lower in ease-of-use. Similar issues arose when clearing annotations and measurements from the screen. Though dual screens are useful, they were not always easy to use and often cut off a portion of a desired image.

The number of deviations measured the efficiency of use of each ultrasound system.
Across all tasks and systems, there were no notable differences in the number of observed deviations. A deviation occurred when a participant, for example, went to a wrong screen, interacted incorrectly with an on-screen control, or performed a step to correct a previous error.

Despite no significant difference in the number of deviations, observations and qualitative feedback provided insights regarding efficiency. For example, providing a single route for accessing functions such as measurements via the touch panel, rather than two routes (touch panel and monitor), kept the monitor uncluttered and required less navigation and interaction with the control panel. Participants stated that improving the control panel and its relationship to the touch panel has great value because it frees users from needing a physical QWERTY keyboard. This is a critical finding: a keyboard either takes up space on the control panel or requires the sonographer to move further back from the system to pull out a keyboard drawer.

SUS scores of 78 (s = 14) and 88 (s = 11) for the beta system's two groups of participants mean that the participants rated the system above average, finding it very usable and learnable. System A scored 66 (s = 26) and system B scored 50 (s = 28), both of which rank below average. A system that is more usable and learnable could require less training time to reach proficiency, embody more efficient and effective workflows, and deliver higher satisfaction of use.

Touch panel and monitor displays on ultrasound systems need to be clear and intuitive to produce a positive UX. Busy screens lead to frustration and errors, as observed during the comparative usability test.

Controls for similar tasks should be grouped together on the control panel and touch panel using positioning, framing, and separate pages.
For example, the beta system displayed color-coded annotations by groupings such as side, axis, organ, and plane.

Another way to improve system usability is with color coding and context-aware controls. Controls that light up based on the selected mode, as observed on system A and the beta system, allowed sonographers to focus on the task at hand and supported a more effective and efficient workflow.

The ability to place annotations in the correct location with better descriptors has a direct impact on the quality of results communicated to other clinicians: it increases comprehension and the speed of processing the images.

It should be easy for sonographers to interact with the touch panel. Sonographers who can easily select and adjust items on the touch panel will have more frequent and better interactions with their ultrasound system.

During scanning tasks, navigating the ultrasound system should be simple and efficient. As mentioned previously, visible controls should identify the options available to a sonographer based on the selected mode. Sonographers should be able to move from one scanning task to the next while staying informed of the system's status and mode.

All of these factors add up to better usability, which encourages end users to use an ultrasound system more often, with increased accuracy, reduced exam time, and improved user and patient satisfaction.